chore(migration): Migrate code from googleapis/python-bigquery into packages/google-cloud-bigquery#16008
Conversation
* chore(deps): update all dependencies * 🦉 Updates from OwlBot post-processor See https://github.com/googleapis/repo-automation-bots/blob/main/packages/owl-bot/README.md --------- Co-authored-by: Owl Bot <gcf-owl-bot[bot]@users.noreply.github.com>
* feat: add default timeout for Client.get_job() * change timeout type detection * lint * fix unit test and coverage * add type hint * fix type hint * change import style and add comments * remove sentinel value in client * type hint * typo * add sentinel for query_and_wait() * add unit tests * fix unit test * Update google/cloud/bigquery/job/query.py Co-authored-by: Tim Sweña (Swast) <swast@google.com> * Update google/cloud/bigquery/job/query.py Co-authored-by: Tim Sweña (Swast) <swast@google.com> * address comments * typo * type hint * typos --------- Co-authored-by: Tim Sweña (Swast) <swast@google.com>
This updates tests to use `max_iterations` rather than `max_iteration` which was an alpha option. Related: b/344469351
…thon (#1941) Updates the regular continuous CI/CD checks to test against specific versions of Python (versions that are neither our most recent supported version nor our oldest supported version). Also removes a CI/CD check that is superseded by a more recent one (prerelease-deps, replaced by prerelease-deps-3.12). Modifies owlbot so that it does not add prerelease-deps back into the mix, since that file is a default in synthtool.
…use to download first page of results (#1942) * perf: if `page_size` or `max_results` is set on `QueryJob.result()`, use to download first page of results * add unit tests for query_and_wait * populate maxResults on page 2 * fix maxResults * fix coverage --------- Co-authored-by: Lingqing Gan <lingqing.gan@gmail.com>
* fix: create query job in job.result() if doesn't exist * Apply suggestions from code review --------- Co-authored-by: Tim Sweña (Swast) <swast@google.com>
Co-authored-by: release-please[bot] <55107282+release-please[bot]@users.noreply.github.com>
…1949) * test: update the results of test based on change to hacker news data * Update tests/system/test_client.py --------- Co-authored-by: Lingqing Gan <lingqing.gan@gmail.com>
* chore(deps): update all dependencies * 🦉 Updates from OwlBot post-processor See https://github.com/googleapis/repo-automation-bots/blob/main/packages/owl-bot/README.md --------- Co-authored-by: Owl Bot <gcf-owl-bot[bot]@users.noreply.github.com> Co-authored-by: Lingqing Gan <lingqing.gan@gmail.com>
* chore(deps): update all dependencies * 🦉 Updates from OwlBot post-processor See https://github.com/googleapis/repo-automation-bots/blob/main/packages/owl-bot/README.md * Update samples/geography/requirements.txt * Update samples/geography/requirements.txt --------- Co-authored-by: Owl Bot <gcf-owl-bot[bot]@users.noreply.github.com> Co-authored-by: Chalmer Lowe <chalmerlowe@google.com>
* feat: support load job option ColumnNameCharacterMap * add unit test
…set (#1956) * fix: do not overwrite page_size with max_results when start_index is set * update test
Co-authored-by: release-please[bot] <55107282+release-please[bot]@users.noreply.github.com>
Co-authored-by: Lingqing Gan <lingqing.gan@gmail.com>
* chore(deps): update all dependencies * 🦉 Updates from OwlBot post-processor See https://github.com/googleapis/repo-automation-bots/blob/main/packages/owl-bot/README.md --------- Co-authored-by: Owl Bot <gcf-owl-bot[bot]@users.noreply.github.com>
Co-authored-by: Lingqing Gan <lingqing.gan@gmail.com>
#1972) Fixes constraints file to match setup.py
* feat: use `bigquery-magics` package for the `%%bigquery` magic * ignore types on bigquery-magics package * Update samples/magics/noxfile_config.py Co-authored-by: Chalmer Lowe <chalmerlowe@google.com> --------- Co-authored-by: Chalmer Lowe <chalmerlowe@google.com>
* chore: update templated files * remove obsolete code in owlbot.py * 🦉 Updates from OwlBot post-processor See https://github.com/googleapis/repo-automation-bots/blob/main/packages/owl-bot/README.md --------- Co-authored-by: Owl Bot <gcf-owl-bot[bot]@users.noreply.github.com>
….3 (#1976) * fix: Allow protobuf 5.x; require protobuf >=3.20.2; proto-plus >=1.22.3 * Update constraints --------- Co-authored-by: Lingqing Gan <lingqing.gan@gmail.com>
* docs: add short mode query sample & test
Source-Link: googleapis/synthtool@bef813d Post-Processor: gcr.io/cloud-devrel-public-resources/owlbot-python:latest@sha256:94bb690db96e6242b2567a4860a94d48fa48696d092e51b0884a1a2c0a79a407 Co-authored-by: Owl Bot <gcf-owl-bot[bot]@users.noreply.github.com>
Co-authored-by: Lingqing Gan <lingqing.gan@gmail.com>
* fix: add warning when encountering unknown field types The types returned for currently unsupported field types may change in the future, when support is added. Warn users that the types they are using are not yet supported. * fix: add warning for unknown subfield types as well * fix: remove unused warnings * fix: remove leftover debugging code * move test case closer to related test * add comments * fix formatting * fix test_table and use warnings.warn instead of pytest.warn * add explicit warning about behavior subject to change in the future add warning for write and warn about future behavior changes * add default converter for _SCALAR_VALUE_TO_JSON_PARAM * factor out shared warning * fix test case and make coverage happy * add unit test to StructQueryParameter class --------- Co-authored-by: Lingqing Gan <lingqing.gan@gmail.com>
* revises Exception type * updates error choices
…it (#1995) * adjusts location of checks related to docfx/docs * 🦉 Updates from OwlBot post-processor See https://github.com/googleapis/repo-automation-bots/blob/main/packages/owl-bot/README.md --------- Co-authored-by: Owl Bot <gcf-owl-bot[bot]@users.noreply.github.com>
* chore(deps): update all dependencies * 🦉 Updates from OwlBot post-processor See https://github.com/googleapis/repo-automation-bots/blob/main/packages/owl-bot/README.md * pin PyYAML===6.0.1 for python 3.7 * fix the version pinning --------- Co-authored-by: Owl Bot <gcf-owl-bot[bot]@users.noreply.github.com> Co-authored-by: Lingqing Gan <lingqing.gan@gmail.com>
PR created by the Librarian CLI to initialize a release. Merging this PR will auto trigger a release. Librarian Version: v0.8.0 Language Image: us-central1-docker.pkg.dev/cloud-sdk-librarian-prod/images-prod/python-librarian-generator@sha256:c8612d3fffb3f6a32353b2d1abd16b61e87811866f7ec9d65b59b02eb452a620 <details><summary>google-cloud-bigquery: 3.40.1</summary> ## [3.40.1](https://togithub.com/googleapis/python-bigquery/compare/v3.40.0...v3.40.1) (2026-02-12) ### Bug Fixes * updates timeout/retry code to respect hanging server (#2408) ([24d45d0d](https://togithub.com/googleapis/python-bigquery/commit/24d45d0d)) * add timeout parameter to to_dataframe and to_arrow met… (#2354) ([4f67ba20](https://togithub.com/googleapis/python-bigquery/commit/4f67ba20)) ### Documentation * clarify that only jobs.query and jobs.getQueryResults are affec… (#2349) ([73228432](https://togithub.com/googleapis/python-bigquery/commit/73228432)) </details>
… into migration.python-bigquery.migration.2026-03-02_16-59-45.migrate
Summary of Changes

Hello, I'm Gemini Code Assist! I'm currently reviewing this pull request and will post my feedback shortly. In the meantime, here's a summary to help you and other reviewers quickly get up to speed. This pull request completes a significant architectural change by migrating the Google Cloud BigQuery Python client library into a dedicated package directory. This restructuring consolidates the project's codebase, streamlines its development lifecycle, and integrates it more effectively into a monorepo environment. The migration includes all source code, documentation, and continuous integration configurations, ensuring that the package remains fully functional and independently manageable in its new location.
Using Gemini Code AssistThe full guide for Gemini Code Assist can be found on our documentation page, here are some quick tips. Invoking Gemini You can request assistance from Gemini at any point by creating a comment using either
Customization To customize Gemini Code Assist for GitHub experience, repository maintainers can create a configuration file and/or provide a custom code review style guide (such as PEP-8 for Python) by creating and adding files to a Limitations & Feedback Gemini Code Assist may make mistakes. Please leave feedback on any instances where its feedback is incorrect or counter productive. You can react with 👍 and 👎 on @gemini-code-assist comments. If you're interested in giving your feedback about your experience with Gemini Code Assist for Github and other Google products, sign up here. You can also get AI-powered code generation, chat, as well as code reviews directly in the IDE at no cost with the Gemini Code Assist IDE Extension. Footnotes
|
There was a problem hiding this comment.
Code Review
This pull request includes changes to the google-cloud-bigquery package, including updates to issue tracker URLs, test matrix configurations, Kokoro build file paths, repository names in contributing guides, and version-specific warnings. It also addresses potential issues with the coverage combine command and flakybot execution in Kokoro. The review comments suggest updating hardcoded URLs to reflect the new monorepo structure, removing premature Python versions from test matrices, improving the robustness of coverage reporting, and ensuring the existence of flakybot before execution. Additionally, the repo field in .repo-metadata.json and URLs in the contributing guide are updated to reflect the new package location, and the version in the contributing guide is updated to be more general.
Note: Security Review is unavailable for this PR.
I was unable to create individual review comments; my feedback is listed below.
packages/google-cloud-bigquery/.github/ISSUE_TEMPLATE/bug_report.md (13)
The issue tracker URL is hardcoded to the old repository name. It should be updated to reflect the new package location within the monorepo, or a more general issue tracker if applicable.
- Search the issues already opened: https://github.com/googleapis/google-cloud-python/issues?q=is%3Aissue+label%3Acomponent%3Abigquery
packages/google-cloud-bigquery/.github/PULL_REQUEST_TEMPLATE.md (2)
The URL for opening a new issue is hardcoded to the old repository name. It should be updated to reflect the new package location within the monorepo, or a more general issue creation link if applicable.
- [ ] Make sure to open an issue as a [bug/issue](https://github.com/googleapis/google-cloud-python/issues/new/choose?labels=component%3Abigquery) before writing your code! That way we can discuss the change, evaluate designs, and agree on the general idea
packages/google-cloud-bigquery/.github/workflows/unittest.yml (11)
Including Python 3.14 in the test matrix might be premature as it is not yet officially released. This could lead to unexpected build failures or instability. Consider removing it until it reaches a stable release, or explicitly target a pre-release version if that's the intent.
python: ['3.9', '3.10', '3.11', '3.12', '3.13']

packages/google-cloud-bigquery/.github/workflows/unittest.yml (40)
Similar to the general unit test matrix, including Python 3.14 here might be premature. Consider removing it until it reaches a stable release.
python: ['3.9']

packages/google-cloud-bigquery/.github/workflows/unittest.yml (87)
The coverage combine command might not correctly locate the .coverage files if unzip extracts them into subdirectories (e.g., .coverage-results/coverage-artifact-3.9/.coverage-3.9). It's more robust to use find to locate the files explicitly.
coverage combine $(find .coverage-results -type f -name '.coverage*')

packages/google-cloud-bigquery/.kokoro/build.sh (47-48)
It's good practice to check if flakybot exists before attempting to chmod +x and execute it. This prevents potential errors if the file is missing.
if [[ -f "$KOKORO_GFILE_DIR/linux_amd64/flakybot" ]]; then
chmod +x $KOKORO_GFILE_DIR/linux_amd64/flakybot
$KOKORO_GFILE_DIR/linux_amd64/flakybot
fi
packages/google-cloud-bigquery/.kokoro/continuous/common.cfg (17)
The build_file path refers to the old repository name. It should be updated to reflect the new package location within the monorepo.
build_file: "packages/google-cloud-bigquery/.kokoro/trampoline.sh"
packages/google-cloud-bigquery/.kokoro/continuous/common.cfg (26)
The TRAMPOLINE_BUILD_FILE value refers to the old repository name. It should be updated to reflect the new package location within the monorepo.
value: "github/packages/google-cloud-bigquery/.kokoro/build.sh"
packages/google-cloud-bigquery/.kokoro/presubmit/common.cfg (17)
The build_file path refers to the old repository name. It should be updated to reflect the new package location within the monorepo.
build_file: "packages/google-cloud-bigquery/.kokoro/trampoline.sh"
packages/google-cloud-bigquery/.kokoro/presubmit/common.cfg (26)
The TRAMPOLINE_BUILD_FILE value refers to the old repository name. It should be updated to reflect the new package location within the monorepo.
value: "github/packages/google-cloud-bigquery/.kokoro/build.sh"
packages/google-cloud-bigquery/.kokoro/samples/lint/common.cfg (18)
The TRAMPOLINE_BUILD_FILE value refers to the old repository name. It should be updated to reflect the new package location within the monorepo.
value: "github/packages/google-cloud-bigquery/.kokoro/test-samples.sh"
packages/google-cloud-bigquery/.kokoro/samples/python3.10/common.cfg (24)
The TRAMPOLINE_BUILD_FILE value refers to the old repository name. It should be updated to reflect the new package location within the monorepo.
value: "github/packages/google-cloud-bigquery/.kokoro/test-samples.sh"
packages/google-cloud-bigquery/.kokoro/samples/python3.10/periodic-head.cfg (10)
The TRAMPOLINE_BUILD_FILE value refers to the old repository name. It should be updated to reflect the new package location within the monorepo.
value: "github/packages/google-cloud-bigquery/.kokoro/test-samples-against-head.sh"
packages/google-cloud-bigquery/.kokoro/samples/python3.11/common.cfg (24)
The TRAMPOLINE_BUILD_FILE value refers to the old repository name. It should be updated to reflect the new package location within the monorepo.
value: "github/packages/google-cloud-bigquery/.kokoro/test-samples.sh"
packages/google-cloud-bigquery/.kokoro/samples/python3.11/periodic-head.cfg (10)
The TRAMPOLINE_BUILD_FILE value refers to the old repository name. It should be updated to reflect the new package location within the monorepo.
value: "github/packages/google-cloud-bigquery/.kokoro/test-samples-against-head.sh"
packages/google-cloud-bigquery/.kokoro/samples/python3.12/common.cfg (24)
The TRAMPOLINE_BUILD_FILE value refers to the old repository name. It should be updated to reflect the new package location within the monorepo.
value: "github/packages/google-cloud-bigquery/.kokoro/test-samples.sh"
packages/google-cloud-bigquery/.kokoro/samples/python3.12/periodic-head.cfg (10)
The TRAMPOLINE_BUILD_FILE value refers to the old repository name. It should be updated to reflect the new package location within the monorepo.
value: "github/packages/google-cloud-bigquery/.kokoro/test-samples-against-head.sh"
packages/google-cloud-bigquery/.kokoro/samples/python3.13/common.cfg (24)
The TRAMPOLINE_BUILD_FILE value refers to the old repository name. It should be updated to reflect the new package location within the monorepo.
value: "github/packages/google-cloud-bigquery/.kokoro/test-samples.sh"
packages/google-cloud-bigquery/.kokoro/samples/python3.13/periodic-head.cfg (10)
The TRAMPOLINE_BUILD_FILE value refers to the old repository name. It should be updated to reflect the new package location within the monorepo.
value: "github/packages/google-cloud-bigquery/.kokoro/test-samples-against-head.sh"
packages/google-cloud-bigquery/.kokoro/samples/python3.14/common.cfg (24)
The TRAMPOLINE_BUILD_FILE value refers to the old repository name. It should be updated to reflect the new package location within the monorepo.
value: "github/packages/google-cloud-bigquery/.kokoro/test-samples.sh"
packages/google-cloud-bigquery/.kokoro/samples/python3.14/periodic-head.cfg (10)
The TRAMPOLINE_BUILD_FILE value refers to the old repository name. It should be updated to reflect the new package location within the monorepo.
value: "github/packages/google-cloud-bigquery/.kokoro/test-samples-against-head.sh"
packages/google-cloud-bigquery/.kokoro/samples/python3.9/common.cfg (24)
The TRAMPOLINE_BUILD_FILE value refers to the old repository name. It should be updated to reflect the new package location within the monorepo.
value: "github/packages/google-cloud-bigquery/.kokoro/test-samples.sh"
packages/google-cloud-bigquery/.kokoro/samples/python3.9/periodic-head.cfg (10)
The TRAMPOLINE_BUILD_FILE value refers to the old repository name. It should be updated to reflect the new package location within the monorepo.
value: "github/packages/google-cloud-bigquery/.kokoro/test-samples-against-head.sh"
packages/google-cloud-bigquery/.kokoro/test-samples-impl.sh (36)
Explicitly using python3.9 might be too restrictive. Consider using python3 or relying on the PATH to find the appropriate Python interpreter, unless 3.9 is a strict requirement for this specific step.
python3 -m pip install --upgrade --quiet nox virtualenv
packages/google-cloud-bigquery/.kokoro/test-samples-impl.sh (79)
Explicitly using python3.9 might be too restrictive. Consider using python3 or relying on the PATH to find the appropriate Python interpreter, unless 3.9 is a strict requirement for this specific step.
python3 -m nox -s "$RUN_TESTS_SESSION"
packages/google-cloud-bigquery/.repo-metadata.json (10)
The repo field refers to the old repository name. It should be updated to reflect the new package location within the monorepo.
"repo": "googleapis/google-cloud-python/packages/google-cloud-bigquery",
packages/google-cloud-bigquery/CONTRIBUTING.rst (38)
The repository name in the contributing guide refers to the old name. It should be updated to reflect the new package location.
``google-cloud-bigquery`` `repo`_ on GitHub.
packages/google-cloud-bigquery/CONTRIBUTING.rst (41)
The repository name in the contributing guide refers to the old name. It should be updated to reflect the new package location.
- Clone your fork of ``google-cloud-bigquery`` from your GitHub account to your local
packages/google-cloud-bigquery/CONTRIBUTING.rst (50)
The repository name in the contributing guide refers to the old name. It should be updated to reflect the new package location.
# Configure remotes such that you can pull changes from the googleapis/google-cloud-python
packages/google-cloud-bigquery/CONTRIBUTING.rst (63)
The repository name in the contributing guide refers to the old name. It should be updated to reflect the new package location.
version of ``google-cloud-bigquery``. The
packages/google-cloud-bigquery/CONTRIBUTING.rst (210)
The version google-cloud-bigquery==1.28.0 might be outdated. Consider updating it to the latest relevant version or removing the specific version number if it's meant to be a general example.
The last version of this library compatible with Python 2.7 and 3.5 is
`google-cloud-bigquery==2.x.x`.
packages/google-cloud-bigquery/CONTRIBUTING.rst (241)
The URL for the noxfile.py config refers to the old repository name. It should be updated to reflect the new package location within the monorepo.
.. _config: https://github.com/googleapis/google-cloud-python/blob/main/packages/google-cloud-bigquery/noxfile.py
packages/google-cloud-bigquery/README.rst (15)
The URL for general availability points to the monorepo's README. It should be updated to point to the specific google-cloud-bigquery package's GA status within the monorepo, or a more specific GA policy if available.
.. |GA| image:: https://img.shields.io/badge/support-GA-gold.svg
:target: https://github.com/googleapis/google-cloud-python/blob/main/packages/google-cloud-bigquery/README.rst#general-availability
packages/google-cloud-bigquery/README.rst (62)
The version google-cloud-bigquery==1.28.0 might be outdated. Consider updating it to the latest relevant version or removing the specific version number if it's meant to be a general example.
The last version of this library compatible with Python 2.7 and 3.5 is
`google-cloud-bigquery==2.x.x`.
packages/google-cloud-bigquery/benchmark/benchmark.py (73)
The `_parse_tag` function assumes that a colon will always be present in the tag string. If a tag is provided without a value (e.g., `--tag somekeywithnovalue`), `split(":")` will return a list with one element, causing a `ValueError`. The function should handle tags without values gracefully.
parts = tagstring.split(":", 1)
key = parts[0]
value = parts[1] if len(parts) > 1 else ""

packages/google-cloud-bigquery/benchmark/benchmark.py (229)
Comparing datetime.fromisoformat(time_str) directly to datetime.min might be problematic if time_str is not a valid ISO format string or if datetime.min is not the intended sentinel. It's generally safer to check if time_str is an empty string or a specific sentinel value before conversion, or to use a try-except block for fromisoformat.
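The try/except approach mentioned above could be sketched as follows (`is_sentinel_time` is a hypothetical helper name, not code from the repository):

```python
from datetime import datetime


def is_sentinel_time(time_str: str) -> bool:
    """Return True if time_str is empty or parses to the datetime.min sentinel."""
    if not time_str:
        return True
    try:
        parsed = datetime.fromisoformat(time_str)
    except ValueError:
        # Not a valid ISO-8601 string; it cannot be the sentinel.
        return False
    return parsed == datetime.min
```

This avoids an unhandled `ValueError` on malformed input while still recognizing the `datetime.min` sentinel.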
return time_str == datetime.min.isoformat()

packages/google-cloud-bigquery/docs/conf.py (159)
The github_repo setting refers to the old repository name. It should be updated to reflect the new package location within the monorepo.
"github_repo": "google-cloud-python/packages/google-cloud-bigquery",
packages/google-cloud-bigquery/docs/conf.py (241)
The URL for the noxfile.py config refers to the old repository name. It should be updated to reflect the new package location within the monorepo.
.. _config: https://github.com/googleapis/google-cloud-python/blob/main/packages/google-cloud-bigquery/noxfile.py

packages/google-cloud-bigquery/docs/design/query-retries.md (24)
The URL for the api_method parameter refers to the old repository name. It should be updated to reflect the new package location within the monorepo.
was [added via the `api_method`
parameter](https://github.com/googleapis/google-cloud-python/pull/967) and is
packages/google-cloud-bigquery/docs/design/query-retries.md (33)
The URLs for requested features refer to the old repository name or the monorepo. They should be updated to reflect the new package location within the monorepo.
The ability to re-issue a query automatically was a [long](https://github.com/googleapis/google-cloud-python/issues/5555) [requested](https://github.com/googleapis/google-cloud-python/issues/14) [feature](https://github.com/googleapis/google-cloud-python/issues/539). As work ramped up on the SQLAlchemy connector, it became clear that this feature was necessary to keep the test suite, which issues hundreds of queries, from being [too flakey](https://github.com/googleapis/python-bigquery-sqlalchemy/issues?q=is%3Aissue+is%3Aclosed+author%3Aapp%2Fflaky-bot+sort%3Acreated-asc).
packages/google-cloud-bigquery/docs/design/query-retries.md (38)
The URL for re-issuing a query refers to the old repository name. It should be updated to reflect the new package location within the monorepo.
client re-issues a
query](https://github.com/googleapis/google-cloud-python/pull/837) as it was
packages/google-cloud-bigquery/docs/design/query-retries.md (96-97)
The issue URL refers to the old repository name. It should be updated to reflect the new package location within the monorepo.
strictly needed
([Issue #1122](https://github.com/googleapis/google-cloud-python/issues/1122)
has been opened to investigate this).
packages/google-cloud-bigquery/docs/snippets.py (121-124)
The issue URL for flaky update_table() points to the monorepo. It should be updated to point to the specific google-cloud-bigquery package within the monorepo if possible, or be updated to a more general link if the old issue is no longer relevant.
@pytest.mark.skip(
reason=(
"update_table() is flaky "
"https://github.com/googleapis/google-cloud-python/issues/5589?q=is%3Aissue+label%3Acomponent%3Abigquery"
)
)
packages/google-cloud-bigquery/docs/snippets.py (158-161)
The issue URL for flaky update_table() points to the monorepo. It should be updated to point to the specific google-cloud-bigquery package within the monorepo if possible, or be updated to a more general link if the old issue is no longer relevant.
@pytest.mark.skip(
reason=(
"update_table() is flaky "
"https://github.com/googleapis/google-cloud-python/issues/5589?q=is%3Aissue+label%3Acomponent%3Abigquery"
)
)
packages/google-cloud-bigquery/google/cloud/bigquery/__init__.py (128-136)
The warning about Python 3.9 and 3.10 support might be premature or too broad, given that the unittest.yml file still tests against Python 3.9+. Consider updating the warning to reflect the actually supported versions, or adjusting the CI configuration if 3.9/3.10 are still intended to be supported with a warning.
if sys_major == 3 and sys_minor in (9, 10):
warnings.warn(
"The python-bigquery library no longer supports Python 3.9 "
"and Python 3.10. "
f"Your Python version is {sys_major}.{sys_minor}.{sys_micro}. We "
"recommend that you update soon to ensure ongoing support. For "
"more details, see: [Google Cloud Client Libraries Supported Python Versions policy](https://cloud.google.com/python/docs/supported-python-versions)",
FutureWarning,
)
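As a sanity check, a version gate like the one quoted above can be exercised in isolation with `warnings.catch_warnings` (a sketch; `check_python_version` is an illustrative stand-in, not a function from the library):

```python
import warnings


def check_python_version(major: int, minor: int, micro: int = 0) -> None:
    """Emit a FutureWarning for Python versions slated to lose support."""
    if major == 3 and minor in (9, 10):
        warnings.warn(
            "The python-bigquery library no longer supports Python 3.9 "
            f"and Python 3.10. Your Python version is {major}.{minor}.{micro}.",
            FutureWarning,
        )


# Capture the warning instead of printing it:
with warnings.catch_warnings(record=True) as caught:
    warnings.simplefilter("always")
    check_python_version(3, 9, 18)
```

Running the gate for an unaffected version (e.g. 3.12) inside the same context records nothing, which is the behavior the review comment asks to keep consistent with the CI matrix.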
See #10980.
This PR should be merged with a merge-commit, not a squash-commit, in order to preserve the git history.